I’ve been reading Chris Mooney’s commentary (here, here and here) about the Office of Management and Budget’s (OMB) guidelines for implementing the Data Quality Act. He points out that industry has turned the Act into an impediment to environmental regulation. I’ve looked at the case of the U.S. EPA, and as you’ll see below, there are examples that make his point.
At the same time, I’ve been a firm adherent of EPA’s Data Quality Objectives (DQO) process for developing study designs for investigating hazardous waste sites. Investigations that are planned without DQOs inevitably resemble very expensive, failed treasure hunts. Does my insistence that waste site investigations be structured around specific questions and defined decision points put me on the side of the OMB? Not sure – but what I do know is that DQO guidance is part of EPA’s long-standing Quality System, which was incorporated into its data quality guidelines developed in 2002 in response to the Data Quality Act. So, I am a user of the Data Quality Act and will say that it can have a role in producing usable data for environmental decision-making.
However, I agree that the system’s provision for challenges to data developed by EPA is open to abuse, bogging down regulatory decision making. For example, the Perchlorate Study Group’s challenge of EPA’s risk assessment for perchlorate included requests for high-resolution images of slides of brain tissue from rats dosed with perchlorate, and information on the conditions under which the slides were stored and prepared, down to who sliced the tissue sections and what kind of microtome they used. To the naïve eye, this could look like the height of obstructionism. But these requests may make some sense. According to the PSG,
“As EPA knows, it is widely believed that the neurodevelopmental effects observed by EPA were artifacts of laboratory errors. This information is essential because there is a serious danger that differences in tissue compression during the histology could have created an apparent perchlorate effect by artifact alone.”
(Hah – so there is some recognition that there could be a neurodevelopmental effect from perchlorate exposure!)
Since the animal testing data are important in developing the Reference Dose for perchlorate (the basis for the 1 ppb action level in drinking water proposed in 2002), perhaps chasing down artifacts in pathology is important. I won’t debate right this minute how important the data from that particular animal study are, but the point remains that no one – no one – collects perfect data: not the government, not industry, not academia. Some researchers are better than others, but no one is perfect. Without some role for weight of evidence from multiple studies and some tolerance for making decisions and drawing conclusions under uncertainty, all of science would grind to a halt.
What’s unstated in this debate over data quality in regulatory decision making is our failure to come up with a reasonable mechanism for making decisions under uncertainty. I’ve struggled with how to articulate this view, but help has recently arrived in the form of a review published in Environmental Science and Policy. I’ve only reviewed the abstract of this article (the full article is available for a fee), but it sounds very promising:
“. . . mischaracterization concerning the use of science in the U.S. policy process has led to unreasonable expectations about the role that scientific information can play in the development of environmental and public health policies. This in turn has led to implementation of misguided and self-defeating policy initiatives designed to ensure the objectivity or "soundness" of scientific inputs to the policy process.”
The authors argue that scientific findings cannot be stripped of their social contexts, such that scientific assessment conducted in support of policy making rarely, if ever, lends itself to description as “objective” or “non-objective”. Therefore, policy initiatives such as the Data Quality Act, which are intended to assure objectivity, reliability, transparency, etc., are, in the end, exercises in futility. Their perspective on solving the problem is a bit vague (“scientific findings draw much of their validity through the context of their application in policy narratives”), but I think the diagnosis is sound – we’re expecting too much from science in solving environmental problems.
From David Appell (Quark Soup) today, a quote from the Sierra Club:
"Today the Environmental Protection Agency announced in its 2003 National Listing of Fish and Wildlife Advisories that 766,872 miles of America's rivers and 13,068,990 lake acres are contaminated with so much poisonous mercury that the fish aren't safe to eat -- that is a more than 60 percent increase for river miles and an eight percent increase for lake acres since the 2002 report."
David asks whether the increased river miles and lake acres mean there is more mercury pollution, or just that more monitoring is going on.
It’s more monitoring. According to EPA’s fact sheet:
“In 2003, the geographic extent of the states under advisory for mercury was 13,068,990 lake acres and 766,872 river miles. The increase in acres and river miles under advisory is a result of the issuance of statewide mercury advisories by Montana and Washington in 2003 and the addition of rivers to Wisconsin’s statewide advisory.”

I wish I knew what was on the Sierra Club’s mind in publishing this factoid. Are they implying that there is more health risk from consuming mercury-contaminated fish than last year? Putting aside the fact that the increase is due to some regulatory agencies getting caught up on their paperwork, exposure to toxic substances just doesn’t work that way. The Sierra Club should be ashamed of such sloppy reasoning and of trivializing mercury risks in this manner. Their assigned remedial reading for this evening can be downloaded from here.
Courtesy of the Bloviator is this discussion of the intersection of environmental health risks and land use. He notes that land use planning or city planning is “one of the oldest means through which society can improve public health”, an important element of the Progressive Era that contributed to the foundation of the American Public Health Association.
The impetus for that post is an article published in Occupational and Environmental Medicine that presents an epidemiological study of the association of benzene exposures with occurrence of childhood leukemias (the abstract is published here; a review in WebMD here).
The journal makes non-subscribers cough up $25 for a copy of the paper. However, according to the abstract, the paper documents a case control study that analyzed the association between potential environmental exposure to hydrocarbons and the risk of acute childhood leukemia. A case control study is a retrospective comparison of the exposures of persons with a disease (cases) versus persons without the disease (controls), and is one of the more robust study methods in epidemiology (some good resources on epidemiology can be found here and here). The authors found no clear association between maternal occupational exposure to hydrocarbons during pregnancy and leukemia, or between residential traffic density and leukemia. They did conclude there was an association between residential proximity to gas stations or service garages during childhood and the risk of childhood leukemia, even when corrected for confounding factors. Benzene, a normal constituent of gasoline, is known to be associated with an increased incidence of leukemia.
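As an aside on method: the basic measure of association in a case control study is the odds ratio – the odds of exposure among the cases divided by the odds of exposure among the controls. Here is a minimal sketch of that calculation, using made-up counts rather than anything from the actual paper (and note that the real analysis would have used regression techniques to correct for confounders, not this raw 2×2 version):

```python
import math

def odds_ratio(exposed_cases, unexposed_cases, exposed_controls, unexposed_controls):
    """Odds ratio and 95% confidence interval from a 2x2 case-control table."""
    or_ = (exposed_cases * unexposed_controls) / (unexposed_cases * exposed_controls)
    # Standard error of ln(OR), Woolf (logit) method
    se = math.sqrt(1.0/exposed_cases + 1.0/unexposed_cases +
                   1.0/exposed_controls + 1.0/unexposed_controls)
    ci = (math.exp(math.log(or_) - 1.96 * se), math.exp(math.log(or_) + 1.96 * se))
    return or_, ci

# Hypothetical counts for illustration only -- NOT the study's data
or_, ci = odds_ratio(exposed_cases=30, unexposed_cases=250,
                     exposed_controls=12, unexposed_controls=270)
print(f"OR = {or_:.2f}, 95% CI = ({ci[0]:.2f}, {ci[1]:.2f})")
```

An odds ratio near 1 means no association; an odds ratio well above 1, with a confidence interval that excludes 1, is the kind of result that supports a finding like the gas station association.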
The potential linkage between outdoor emissions of toxic air pollutants and risks to human health has been a major concern in developing the strategy of air toxics control in the U.S. The U.S. EPA’s National Air Toxics Assessment (NATA), released very quietly in 2002, provides a detailed description of these potential risks. However, what’s always been disquieting to me is that the air toxics strategy never seemed well integrated with studies of indoor air exposure to toxic air pollutants. Nearly 20 years ago, the EPA conducted a large program, the Total Exposure Assessment Methodology (TEAM) study, which showed that personal exposure to volatile organic compounds (such as benzene) was more strongly associated with indoor sources than with outdoor ambient emission sources. This conclusion does not appear to have changed over time. It will be interesting to see how this issue plays out. What is clear is that further studies (air monitoring, modeling, more surveillance) will be needed to make a compelling case for better buffer zones between sources of chemical emissions and residential areas. In the meantime, this becomes a classic precautionary principle/“sound science” debate: how much data do we need before we conclude that we should change how we make land use decisions for industries or facilities that use chemicals?
“[t]he numbers of sufferers of brain diseases, including Alzheimer's, Parkinson's and motor neurone disease, have soared across the West in less than 20 years, scientists have discovered.
The alarming rise, which includes figures showing rates of dementia have trebled in men, has been linked to rises in levels of pesticides, industrial effluents, domestic waste, car exhausts and other pollutants, says a report in the journal Public Health.”
Public Health as the name of a journal doesn’t jump out at me. There’s a Journal of Public Health published in the UK (nothing on Alzheimer’s and pollution in the latest issue) and an American Journal of Public Health. However, after searching a couple of different places (National Library of Medicine, Emory University’s MedWeb), no journal simply called Public Health emerges. The search will continue, and for now, we take matters as they are presented.
A couple of observations: first, the reported trebling in brain diseases occurred from the 1970s to the 1990s. The increase could be due to advances in diagnosis as much as to pollutant exposures. Second, some of these disorders have causes other than chemical exposures. For example, an inactive lifestyle, including watching too much television, has been offered as a risk factor for Alzheimer’s disease.
There are concerns about neurobehavioral and cognitive disorders from exposures to chemicals, including PCBs, chlorinated hydrocarbon pesticides, dioxins and furans, and polybrominated diphenyl ethers (PBDEs). But it’s probably too speculative to attribute a generalized rise in the incidence of brain disorders to chemical exposures.
As a public service, Impact Analysis: Adventures in Environmental Health will present, from time to time, tools for environmental health analysis.
After 9-11, the federal government took down numerous web sites providing useful information for assessing natural and man-made hazards, including offsite consequence analysis (OCA) data from Risk Management Plans (RMPs) for chemical facilities, formerly made available by the U.S. Environmental Protection Agency (EPA).
There is a dilemma between addressing a community’s right to know about the hazards from the ton cylinders of chlorine gas at the nearby wastewater treatment plant, and keeping information about how to create terrible clouds of toxic chemicals out of the hands of dangerous people. While the bias for secrecy may reduce the risks of chemical terrorism, it has fostered a dangerous complacency on the part of the general public, who ultimately provide the oversight of authorities and facility operators handling hazardous chemicals. Not enough has been done to address the risks to communities from extremely hazardous substances.
However, for those willing to learn, the tools are available for developing hazard assessments. While not a replacement for the formerly publicly accessible information on extremely hazardous substances in your community, these tools can provide some increased understanding of those hazards and of what your role can be in helping to manage them.
CAMEO
CAMEO is a set of three computer programs – CAMEO, MARPLOT and ALOHA – used to help community officials and first responders plan for chemical emergencies. CAMEO is a database of information on health and explosive hazards, firefighting and other protective measures for thousands of chemicals. MARPLOT is mapping software that can display sensitive receptors, response assets, roads, etc., and can also overlay contaminant plumes from chemical releases onto these terrain features. ALOHA (Areal Locations of Hazardous Atmospheres) is an air dispersion model that can estimate concentrations in air from a hazardous chemical release.
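For a feel for what a dispersion model like ALOHA is doing, here is a minimal sketch of the textbook steady-state Gaussian plume equation. To be clear, this is a simplification of my own, not ALOHA’s actual algorithm (ALOHA also handles dense gases, time-varying releases and other complications), and the release scenario is entirely hypothetical:

```python
import math

def plume_centerline(Q, u, H, x):
    """Ground-level, centerline concentration (g/m^3) at downwind distance x (m)
    from a continuous point source, per the steady-state Gaussian plume equation.
    Q: emission rate (g/s); u: wind speed (m/s); H: effective release height (m).
    Uses Briggs rural dispersion coefficients for neutral stability (class D)."""
    sigma_y = 0.08 * x / math.sqrt(1 + 0.0001 * x)   # horizontal spread (m)
    sigma_z = 0.06 * x / math.sqrt(1 + 0.0015 * x)   # vertical spread (m)
    return (Q / (math.pi * u * sigma_y * sigma_z)) * math.exp(-H**2 / (2 * sigma_z**2))

# Hypothetical release: 100 g/s from a 2 m high vent in a 3 m/s wind
for x in (100, 500, 1000):  # meters downwind
    print(f"{x} m: {plume_centerline(Q=100.0, u=3.0, H=2.0, x=x):.2e} g/m^3")
```

The point of running numbers like these isn’t precision – it’s seeing how quickly concentration falls off with distance, which is the intuition behind buffer zones and evacuation distances.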
CAMEO can be paired up with Acute Exposure Guideline Levels (AEGLs). AEGLs describe the human health risks resulting from short-term exposure to airborne chemicals.
AEGLs
Developed by a national advisory committee funded by the U.S. Environmental Protection Agency (EPA) and the Agency for Toxic Substances and Disease Registry (ATSDR), AEGLs were developed to help both federal and local authorities, as well as private companies, deal with emergencies involving spills, or other accidental exposures. A standard operating procedure (SOP) for calculating AEGLs is available from the National Academy Press. The AEGLs and supporting documentation are also published by NAP, though the values are available from EPA’s web site.
Here’s how these values work:
Exposure at the AEGL-1 concentration in air may result in detectable odor, or produce some noticeable symptoms, such as eye or respiratory irritation (sore throat, cough) or a headache. These effects are “not disabling,” meaning the exposure should not affect your ability to escape and move to fresh air. The symptoms should cease after moving to fresh air.
Exposure to the AEGL-2 concentration in air over the specified time period (30 minutes, 1 hour, etc.) may produce long-lasting or irreversible health effects, usually to the lungs or nervous system. Exposure may also affect your ability to escape (for planning purposes, this tells paramedics and firefighters just how long they have to evacuate people from an impacted area).
Exposure to the AEGL-3 concentration in air over the specified time period could be life-threatening – this tells the paramedics and firefighters that they should be evacuating those areas first. It also tells them to approach those areas in air-supplied respirators.
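If you want to screen a measured or modeled concentration against these tiers – say, an output from ALOHA – the logic is simple. Here’s a minimal sketch; the threshold values below are placeholders of mine, not authoritative AEGLs, so look up the chemical- and duration-specific values on EPA’s web site before relying on anything like this:

```python
# Placeholder 60-minute AEGL values (ppm) for illustration only --
# use the chemical- and duration-specific values published by EPA.
AEGLS_PPM = {"chlorine_60min": {"AEGL-1": 0.5, "AEGL-2": 2.0, "AEGL-3": 20.0}}

def aegl_tier(chemical_duration, concentration_ppm):
    """Return the highest AEGL tier that the concentration meets or exceeds."""
    levels = AEGLS_PPM[chemical_duration]
    tier = "below AEGL-1"
    for name in ("AEGL-1", "AEGL-2", "AEGL-3"):
        if concentration_ppm >= levels[name]:
            tier = name
    return tier

print(aegl_tier("chlorine_60min", 3.5))  # -> AEGL-2: potentially irreversible effects
```

Paired with a dispersion estimate like the plume sketch above, this is the skeleton of the planning question: at what distances does the predicted concentration cross AEGL-2 and AEGL-3?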
So, get started – there are plenty of examples of generic facilities in EPA’s RMP guidance to work with. It’s not the actual RMPs, but with that information and what you can find in EPA’s Toxics Release Inventory (TRI), you can get some idea of the nature of chemical hazards in your community.
What you can do about them to protect yourself – that’s for another day.
I don’t have a lot of use for Salon these days, and it is sometimes difficult to justify the subscription to it. But I struck gold this week, and there were several stories that held my interest. Salon did a review of “Boiling Point” by Ross Gelbspan, a book making the argument that adverse global climate change isn’t something in our distant mid-21st-century future, but a phenomenon we’re experiencing now.
Reading “Our Ecological Footprint” crystallized my sense of a potentially oncoming global trainwreck of resource scarcity and climate change. The conceptual basis and assumptions underlying ecological footprint analysis are open to challenge, and there are many challengers. However, it’s hard to argue with the footprint as a powerful symbol of ecological overshoot. While I’m glad for those fighting the good fight over greenhouse gas emissions and global climate research, it strikes me as a bit of a lost cause to argue about whether or not the hockey stick is the correct model of global climate change. We should turn our attention to environmental security – maintaining biodiversity, protecting critical infrastructure along coastlines, preserving public health and minimizing resource depletion – all of these things are worth doing, even if there is no future dominated by an inhospitable climate.
Environmental security is a dimension that is completely ignored in our global war on terrorism, a policy choice that is certainly going to hurt us someday. Salon quotes Gelbspan:
"The continuing indifference by the United States to atmospheric warming -- since this country generates one-fourth of the world's emissions with 5 percent of its people -- will almost guarantee more anti-U.S. attacks from people whose crops are destroyed by weather extremes, whose populations are afflicted by epidemics of infectious disease, and whose borders are overrun by environmental refugees".
Salon says, “Gelbspan argues that while Americans fret about terrorism, a much worse nightmare is accelerating,” where that nightmare is global climate change. It’s not an either/or proposition. A case has been made for the linkage between environmental impacts and war or terrorism, which shows the hollowness of our current homeland security/hard power approach to the global war on terrorism.
I’ve started re-reading about climate change and human health, since this is an environmental health blog. The May 2001 supplement of Environmental Health Perspectives has a survey of the literature that is a good starting point (EHP’s web site is down now – I’ll link to it later). This is an issue that outweighs most of the environmental health problems that people wring their hands over, such as dioxins. I will be returning to it from time to time.
I’m starting to write about the war of ideologies between the “precautionary principle” and “sound science” and I’ve settled on the regulatory history of vinyl chloride as a good case study. I found a useful history of vinyl chloride in “Deceit and Denial: The Deadly Politics of Industrial Pollution”, by Gerald Markowitz and David Rosner, published by the University of California Press in 2002. Focusing on products made from lead and vinyl chloride, the theme of the book is how U.S. industries attempted to conceal information about adverse health effects from the public and workers, and obstruct or influence actions by the federal government to regulate exposures to those two substances. It’s pretty clear that they line up on the side of labor and citizen activists on this issue, but the book is free of the outrage and anger that typically permeates environmentalist literature. I’ve only read the chapters dealing with vinyl chloride so far, and have found those to be a well-written and carefully documented overview of the topic. It’s going to be a good resource and I am fortunate to live near a good public library that had a copy on hand.
“. . . perchlorate in drinking water sources around the nation has drawn the attention of scientists and public health advocates who say that even small doses of the chemical threaten people’s health.”
Perchlorate has been drawing attention for many, many years now because it has been difficult for the U.S. Environmental Protection Agency to estimate a level in drinking water that protects public health. It’s hard to say why it’s drawing attention now.
The one new piece of information I gleaned from the article was that perchlorate controversy is going to be tossed in the lap of the National Academy of Sciences. That seems to be happening to EPA a lot these days. For the past 15-20 years they’ve been trying to develop scientifically-based health standards for dioxin and trichloroethylene, without success. Those chemicals are also going to the NAS. I don’t find that to be a comforting sign because it seems to reward indecisiveness and dithering by the agency.
What’s next if the NAS can’t provide some good recommendations? Take it to Congress? I can’t wait to see that: “quick, we need to put aside the war on terror so we can debate this drinking water standard for TCE”.
Looking at the Numbers
While the debate continues about a definitive protective level, EPA has provided its enforcement personnel with interim guidance recommending perchlorate concentrations of 4 to 18 parts per billion (ppb) as action levels in groundwater supplies – these potentially represent targets for monitoring or groundwater cleanup. Marianne Lamont Horinko, the agency’s assistant administrator for the Office of Solid Waste and Emergency Response, encouraged agency personnel to “carefully consider the low end” of that range. [A quick aside: those drinking water concentrations correspond to a “provisional Reference Dose (RfD)” of 0.0001 to 0.0005 mg/kg-day, or milligrams of perchlorate ingested per kilogram of body weight each day. A discussion of what the Reference Dose means is a topic for another day, but here’s where you can go if you want to read ahead.]
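For those who want to check the arithmetic, converting an RfD into a drinking water concentration is straightforward, assuming the conventional exposure defaults of a 70-kilogram adult drinking 2 liters of water per day (those defaults are my assumption here, though they are the standard ones):

```python
def rfd_to_ppb(rfd_mg_per_kg_day, body_weight_kg=70.0, intake_L_per_day=2.0):
    """Drinking water concentration (ug/L = ppb) equivalent to a Reference Dose,
    assuming all of the dose comes from drinking water."""
    mg_per_L = rfd_mg_per_kg_day * body_weight_kg / intake_L_per_day
    return mg_per_L * 1000.0  # mg/L -> ug/L (ppb)

print(rfd_to_ppb(0.0001))  # ~3.5 ppb, rounding to the 4 ppb low end
print(rfd_to_ppb(0.0005))  # ~17.5 ppb, rounding to the 18 ppb high end
```

Run the numbers and the provisional RfD range of 0.0001 to 0.0005 mg/kg-day comes out to about 3.5 to 17.5 ppb – which is where the 4 to 18 ppb range comes from.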
For an agency project manager overseeing a groundwater cleanup at a Superfund site, “carefully consider the low end” of that range sounds like code for “direct the responsible party to use the low end of the range for a cleanup level unless they really hold your feet to the fire or threaten to sue us or something. . . .” Not the most systematic approach to risk management.
Perchlorate provides a good example in site cleanup of a process that is just transparent enough to unnerve the general public, but not sufficiently transparent to inform. EPA says that a concentration in drinking water from 4 to 18 ppb is protective enough to use as a cleanup level (for now. . .). For those individuals who have not been dragged through the onerous process of developing a health-based action level, this brings up some good questions, such as “which is it, 4 or 18?” If 4 ppb is needed to protect against adverse health effects, then how can 18 ppb be protective? If 18 is okay, why do we need 4?
When EPA revised its health risk assessment for perchlorate in 2002, the protective level in drinking water dropped to 1 ppb. So now what’s the deal with using 4 and 18 as cleanup levels? Complicating matters a bit further, the state of California proposed in 2004 to use 6 ppb perchlorate as a Public Health Goal (PHG), and eventually a drinking water standard. In practical terms, there is no difference between 4 ppb and 6 ppb as a cleanup level in groundwater, though that point is probably not obvious to most people. More about this chronology can be found here.
Clouding the issue further, EPA in 1995 proposed a drinking water action level of 32 ppb, which predates its 4 to 18 ppb recommendation. A few years ago, an exposure study performed with human volunteers showed that 200 ppb in drinking water represented a no-observed-effect level for perchlorate in adults, which industry reportedly argued was a protective level. That level arguably doesn’t address the adverse effect of concern: interference with iodide uptake by the thyroid gland in pregnant women, with possible effects on neurological development in the fetus.
An industry-DOD group, the Perchlorate Study Group, presented at a 24 May 2004 meeting with the National Academy of Sciences recent unpublished research that preliminarily concluded a perchlorate level of 110 ppb had no effect on pregnant women and infants. I haven’t seen where that value comes from, but the action level in drinking water that seems to make the best use of the considerable scientific database, and that was developed using a transparent process, is 70 ppb. But with all of the zigzagging that has gone on, is anyone prepared to find it believable? There was a thought process underlying each of these numbers, but examined together, they show a regulatory and scientific muddle that doesn’t inspire a lot of confidence that there is a plan for addressing perchlorate-contaminated groundwater. Matters don’t look quite so distressing when the facts are ordered into a coherent narrative, but that doesn’t seem to have happened yet. The story in Environmental Health Perspectives is a good summary of the recent facts, but doesn’t address why the perchlorate history looks like this, or what it means to people who live where there is perchlorate-contaminated groundwater.